Journal article
Latent Relations at Steady-state with Associative Nets
KD Shabahang, H Yim, SJ Dennis
Cognitive Science | Published: 2024
DOI: 10.1111/cogs.13494
Abstract
Models of word meaning that exploit patterns of word usage across large text corpora to capture semantic relations, like the topic model and word2vec, condense word-by-context co-occurrence statistics to induce representations that organize words along semantically relevant dimensions (e.g., synonymy, antonymy, hyponymy, etc.). However, their reliance on latent representations leaves them vulnerable to interference, makes them slow learners, and commits to a dual-systems account of episodic and semantic memory. We show how it is possible to construct the meaning of words online during retrieval to avoid these limitations. We implement a spreading activation account of word meaning in an asso..
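The abstract contrasts condensing co-occurrence statistics into latent representations with constructing meaning online at retrieval via spreading activation. The following is a minimal, hypothetical sketch (not the authors' implementation, and the toy matrix is invented): activation spreads from a probe word to contexts and back to words, iterated to a steady state over raw word-by-context counts, so related words become active without any pre-computed latent space.

```python
# Hypothetical illustration: spreading activation over a toy word-by-context
# co-occurrence matrix, iterated to a steady state, rather than condensing
# the counts into a latent representation. All numbers are invented.

# Toy corpus statistics: rows = words, columns = contexts.
words = ["cat", "dog", "car"]
M = [
    [2, 1, 0],  # "cat" occurs in contexts 0 and 1
    [1, 2, 0],  # "dog" overlaps with "cat" in contexts 0 and 1
    [0, 0, 3],  # "car" occurs only in context 2
]

def spread(activation, matrix, steps=50):
    """Spread activation words -> contexts -> words until it stabilizes."""
    for _ in range(steps):
        # Words activate the contexts they occur in.
        contexts = [
            sum(matrix[w][c] * activation[w] for w in range(len(matrix)))
            for c in range(len(matrix[0]))
        ]
        # Contexts feed activation back to all words that share them.
        new = [
            sum(matrix[w][c] * contexts[c] for c in range(len(contexts)))
            for w in range(len(matrix))
        ]
        norm = sum(abs(x) for x in new) or 1.0
        activation = [x / norm for x in new]
    return activation

# Probe with "cat": at steady state, activation reaches the related word
# "dog" through shared contexts, while the unrelated "car" stays inactive.
steady = spread([1.0, 0.0, 0.0], M)
```

This is essentially power iteration on the word-by-word association matrix implied by shared contexts; the key point the abstract makes is that such retrieval-time dynamics operate directly on stored co-occurrences, with no separate learned semantic representation.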
Related Projects (1)
Grants
Awarded by National Research Foundation of Korea